Denoising Multi-Similarity Formulation: A Self-Paced Curriculum-Driven Approach for Robust Metric Learning

Authors

Abstract

Deep Metric Learning (DML) is a group of techniques that aim to measure the similarity between objects through a neural network. Although the number of DML methods has rapidly increased in recent years, most previous studies cannot effectively handle noisy data, which commonly exists in practical applications and often leads to serious performance deterioration. To overcome this limitation, in this paper, we build a connection between noisy samples and hard samples in the framework of self-paced learning, and propose a Balanced Self-Paced Metric Learning (BSPML) algorithm with a denoising multi-similarity formulation, where noisy samples are treated as extremely hard samples and adaptively excluded from the model training by sample weighting. Especially, due to the pairwise relationship and a new balance regularization term, the sub-problem w.r.t. the sample weights is a nonconvex quadratic function. To efficiently solve this nonconvex quadratic problem, we propose a doubly stochastic projection coordinate gradient algorithm. Importantly, we theoretically prove the convergence not only for the doubly stochastic projection coordinate gradient algorithm, but also for our BSPML algorithm. Experimental results on several standard data sets demonstrate that our approach has better generalization ability and robustness than the state-of-the-art robust DML approaches.
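
The paper's exact formulation is not reproduced on this page, but the weight subproblem the abstract describes can be illustrated. Below is a minimal Python sketch, assuming a generic box-constrained quadratic of the form Q(w) = w·ℓ − λ·Σᵢwᵢ + γ·wᵀAw; the function name, the toy matrix A, and all hyperparameters are illustrative stand-ins rather than the authors' objective, and a single random coordinate pick per step only gestures at the "doubly stochastic" sampling of the actual algorithm:

```python
import numpy as np

def projected_coordinate_gradient(losses, A, lam=0.5, gamma=0.1,
                                  step=0.05, n_iters=2000, seed=0):
    """Minimize an assumed box-constrained quadratic weight subproblem
        Q(w) = w . losses - lam * sum(w) + gamma * w^T A w,  w in [0, 1]^n,
    by projected coordinate gradient with random coordinate picks.  A is a
    symmetric matrix standing in for the pairwise term; if it is indefinite,
    Q is nonconvex, as the abstract notes for the true subproblem."""
    rng = np.random.default_rng(seed)
    n = losses.shape[0]
    w = np.full(n, 0.5)                      # start from neutral weights
    for _ in range(n_iters):
        i = rng.integers(n)                  # stochastic coordinate choice
        grad_i = losses[i] - lam + 2.0 * gamma * A[i] @ w
        w[i] = np.clip(w[i] - step * grad_i, 0.0, 1.0)   # project onto [0, 1]
    return w

# Toy check: the two high-loss (noisy, "extremely hard") samples are driven
# toward weight 0 and so are effectively excluded from training.
rng = np.random.default_rng(1)
losses = np.concatenate([rng.uniform(0.0, 0.3, 8), rng.uniform(2.0, 3.0, 2)])
A = rng.standard_normal((10, 10))
A = 0.5 * (A + A.T)                          # symmetric, possibly indefinite
print(projected_coordinate_gradient(losses, A).round(2))
```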


Similar Articles

Self-Paced Curriculum Learning

Curriculum learning (CL) or self-paced learning (SPL) represents a recently proposed learning regime inspired by the learning process of humans and animals that gradually proceeds from easy to more complex samples in training. The two methods share a similar conceptual learning paradigm, but differ in specific learning schemes. In CL, the curriculum is predetermined by prior knowledge, and rema...
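
For concreteness, the hard-weighting rule that drives classical SPL can be written in a few lines. This is the standard formulation from the SPL literature (Kumar et al., 2010), not code from the paper above:

```python
import numpy as np

def spl_weights(losses, lam):
    """Classical hard self-paced weighting: a sample joins training iff
    its current loss is below the "model age" parameter lam."""
    return (losses < lam).astype(float)

# Growing lam realizes the easy-to-hard curriculum described above.
losses = np.array([0.1, 0.4, 0.9, 2.5])
for lam in (0.5, 1.0, 3.0):
    print(lam, spl_weights(losses, lam))
```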


Self-Paced Multi-Task Learning

Multi-task learning is a paradigm where multiple tasks are learnt jointly. Previous multi-task learning models usually treat all tasks and all instances per task equally during learning. Inspired by the fact that humans often learn from easy concepts to hard ones in the cognitive process, in this paper, we propose a novel multi-task learning framework that attempts to learn the tasks by simultaneo...
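
A minimal sketch of the two-level (task and instance) weighting idea this snippet gestures at; the multiplicative coupling and both thresholds are assumptions for illustration, not the surveyed paper's formulation:

```python
import numpy as np

def multitask_spl_weights(losses, lam_task=1.0, lam_inst=1.0):
    """Illustrative two-level self-paced weighting: an instance is admitted
    only if its task looks easy on average AND the instance itself is easy.
    losses: dict mapping task id -> array of per-instance losses."""
    return {task: float(l.mean() < lam_task) * (l < lam_inst).astype(float)
            for task, l in losses.items()}

# Task B is hard on average, so all of its instances are held back.
print(multitask_spl_weights({"A": np.array([0.2, 0.8]),
                             "B": np.array([2.0, 3.0])}))
```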


Multi-view Self-Paced Learning for Clustering

Exploiting the information from multiple views can improve clustering accuracy. However, most existing multi-view clustering algorithms are nonconvex and are thus prone to getting stuck in bad local minima, especially when there are outliers and missing data. To overcome this problem, we present a new multi-view self-paced learning (MSPL) algorithm for clustering that learns the multi-view ...
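
A sketch of per-view, per-sample self-paced admission, which is one natural way to read the snippet above; the hard-threshold form is an assumption, as MSPL's actual regularizer is only in the full paper:

```python
import numpy as np

def multiview_spl_weights(view_losses, lam=1.0):
    """Illustrative per-view, per-sample self-paced weights: each (view,
    sample) pair is admitted independently, so a sample corrupted in one
    view can still contribute through its clean views.
    view_losses: (n_views, n_samples) array of losses."""
    return (view_losses < lam).astype(float)

L = np.array([[0.3, 2.1, 0.4],    # view 1: sample 2 is an outlier here
              [0.2, 0.5, 3.0]])   # view 2: sample 3 is an outlier here
print(multiview_spl_weights(L))
```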


ScreenerNet: Learning Self-Paced Curriculum for Deep Neural Networks

We propose to learn a curriculum, or a syllabus, for supervised learning with deep neural networks. Specifically, we learn a weight for each training sample via a neural network, called ScreenerNet, attached to the original network, and jointly train the two in an end-to-end fashion. We show that networks augmented with our ScreenerNet achieve early convergence with better accuracy than the state-of...
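
A rough PyTorch sketch of the joint training step described here, with hypothetical toy modules standing in for the real architecture; the published ScreenerNet also has its own screener objective, which this sketch omits:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy modules; the real ScreenerNet architecture and its
# margin-based screener objective are in the paper, not reproduced here.
backbone = nn.Sequential(nn.Flatten(), nn.Linear(784, 10))
screener = nn.Sequential(nn.Flatten(), nn.Linear(784, 1), nn.Sigmoid())
params = list(backbone.parameters()) + list(screener.parameters())
opt = torch.optim.SGD(params, lr=0.1)

x = torch.randn(32, 1, 28, 28)
y = torch.randint(0, 10, (32,))
per_sample = F.cross_entropy(backbone(x), y, reduction="none")
w = screener(x).squeeze(1)           # predicted per-sample weights in (0, 1)
loss = (w * per_sample).mean()       # weighted loss; both nets trained jointly
# NOTE: the published method adds a screener term that prevents the trivial
# solution w -> 0; that term is omitted from this sketch.
opt.zero_grad()
loss.backward()
opt.step()
```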


Robust Sparse Coding via Self-Paced Learning

Sparse coding (SC) is attracting more and more attention due to its comprehensive theoretical studies and its excellent performance in many signal processing applications. However, most existing sparse coding algorithms are nonconvex and are thus prone to getting stuck in bad local minima, especially when there are outliers and noisy data. To enhance the learning robustness, in this paper, w...
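
A minimal sketch of the general idea, assuming hard SPL weights on per-sample reconstruction residuals wrapped around an ISTA update; the function names and thresholds are illustrative, not the surveyed paper's algorithm:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm, as used by ISTA.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def spl_sparse_coding(X, D, lam_spl=1.0, alpha=0.1, n_iters=50):
    """Illustrative SPL-robustified sparse coding: alternate a weighted ISTA
    step on the codes Z with hard self-paced weights that drop samples whose
    reconstruction residual is large (likely outliers).  X: (d, n) data,
    D: (d, k) dictionary.  A growing lam_spl schedule (omitted here) would
    gradually re-admit borderline samples."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2      # ISTA step from Lipschitz bound
    Z = np.zeros((D.shape[1], X.shape[1]))
    w = np.ones(X.shape[1])
    for _ in range(n_iters):
        R = D @ Z - X                                   # residuals, (d, n)
        Z = soft_threshold(Z - step * (D.T @ (R * w)), step * alpha)
        w = (np.sum(R ** 2, axis=0) < lam_spl).astype(float)
    return Z, w

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 30))
X[:, :3] *= 5.0                          # three gross outlier samples
D = rng.standard_normal((20, 40))
Z, w = spl_sparse_coding(X, D, lam_spl=30.0)
print(int(w.sum()), "of", w.size, "samples kept")   # outliers get weight 0
```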



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i9.26324